
    Sample average approximation with heavier tails II: localization in stochastic convex optimization and persistence results for the Lasso

    We present exponential finite-sample nonasymptotic deviation inequalities for the SAA estimator's near-optimal solution set over the class of stochastic optimization problems with heavy-tailed random \emph{convex} functions in the objective and constraints. Such a setting is better suited for problems where a sub-Gaussian data-generating distribution is less expected, e.g., in stochastic portfolio optimization. One of our contributions is to exploit the \emph{convexity} of the perturbed objective and the perturbed constraints as a property which entails \emph{localized} deviation inequalities for joint feasibility and optimality guarantees. This means that our bounds are significantly tighter in terms of diameter and metric entropy, since they depend only on the near-optimal solution set and not on the whole feasible set. As a result, we obtain a much sharper sample complexity estimate than for a general nonconvex problem. In our analysis, we derive localized deterministic perturbation error bounds for convex optimization problems which are of independent interest. To obtain our results, we only assume a metrically regular convex feasible set, possibly not satisfying the Slater condition and not having a metrically regular solution set. In this general setting, joint near feasibility and near optimality are guaranteed. If in addition the set satisfies the Slater condition, we obtain finite-sample simultaneous \emph{exact} feasibility and near optimality guarantees (for a sufficiently small tolerance). Another contribution of our work is to present, as a proof of concept of our localized techniques, a persistence result for a variant of the LASSO estimator under very weak assumptions on the data-generating distribution.
    Comment: 34 pages. Some correction
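
    As a rough illustration of the SAA idea the abstract analyzes, the sketch below solves the empirical (sample-average) counterpart of a toy convex portfolio problem with heavy-tailed returns. The loss, dimensions, and risk parameter `gamma` are hypothetical, chosen only to make the example run; this is not the paper's method or setting.

    ```python
    # Minimal sketch of sample average approximation (SAA) for a stochastic
    # convex program: a toy mean-variance portfolio problem with heavy-tailed
    # (Student-t) returns. All problem data below are illustrative assumptions.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    d, N = 5, 2000                          # assets, sample size (assumed)
    R = rng.standard_t(df=3, size=(N, d))   # heavy-tailed return samples
    gamma = 1.0                             # risk-aversion weight (assumed)

    def saa_objective(x):
        """Empirical average of a convex loss: -mean return + gamma * variance."""
        port = R @ x
        return -port.mean() + gamma * port.var()

    # Convex, compact feasible set: the probability simplex.
    cons = ({"type": "eq", "fun": lambda x: x.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * d
    x0 = np.full(d, 1.0 / d)

    res = minimize(saa_objective, x0, bounds=bounds, constraints=cons,
                   method="SLSQP")
    print("SAA near-optimal weights:", np.round(res.x, 3))
    ```

    The SAA estimator here is simply the minimizer of the empirical average over N i.i.d. samples; the paper's contribution concerns how fast such near-optimal solutions concentrate when the samples are heavy-tailed rather than sub-Gaussian.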

    Width and extremal height distributions of fluctuating interfaces with window boundary conditions

    We present a detailed study of squared local roughness distributions (SLRDs) and local extremal height distributions (LEHDs), calculated in windows of lateral size $l$, for interfaces in several universality classes, in substrate dimensions $d_s = 1$ and $d_s = 2$. We show that their cumulants follow a Family-Vicsek-type scaling and, at early times, when $\xi \ll l$ ($\xi$ is the correlation length), the rescaled SLRDs are given by log-normal distributions, with their $n$th cumulant scaling as $(\xi/l)^{(n-1)d_s}$. This gives rise to an interesting temporal scaling for such cumulants, $\langle w_n \rangle_c \sim t^{\gamma_n}$, with $\gamma_n = 2n\beta + (n-1)d_s/z = \left[2n + (n-1)d_s/\alpha\right]\beta$. This scaling is analytically proved for the Edwards-Wilkinson (EW) and Random Deposition interfaces, and numerically confirmed for other classes. In general, it features small corrections and thus yields exponents $\gamma_n$ (and, consequently, $\alpha$, $\beta$ and $z$) in nice agreement with their respective universality class. Thus, it is a useful framework for numerical and experimental investigations, where it is usually hard to estimate the dynamic exponent $z$ and, mainly, the (global) roughness exponent $\alpha$. The stationary (for $\xi \gg l$) SLRDs and LEHDs of the Kardar-Parisi-Zhang (KPZ) class are also investigated and, for some models, strong finite-size corrections are found. However, we demonstrate that good evidence of their universality can be obtained through successive extrapolations of their cumulant ratios for long times and large $l$. We also show that SLRDs and LEHDs are the same for flat and curved KPZ interfaces.
    Comment: 11 pages, 10 figures, 4 tables
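
    As a quick worked example of the cumulant-scaling relation above, the snippet below evaluates $\gamma_n = 2n\beta + (n-1)d_s/z$ for the one-dimensional EW and KPZ classes, using their standard exponents ($\beta = 1/4$, $z = 2$ for EW; $\beta = 1/3$, $z = 3/2$ for KPZ). This only illustrates the formula; it is not code from the paper.

    ```python
    # Cumulant-scaling relation from the abstract:
    #   gamma_n = 2*n*beta + (n - 1)*d_s / z
    # (equivalently [2n + (n-1)*d_s/alpha]*beta, via the Family-Vicsek
    # relation alpha = beta * z).
    def gamma(n, beta, z, d_s=1):
        return 2 * n * beta + (n - 1) * d_s / z

    # Standard 1D exponents: EW (beta=1/4, z=2), KPZ (beta=1/3, z=3/2).
    for name, beta, z in [("EW", 1/4, 2.0), ("KPZ", 1/3, 1.5)]:
        vals = [gamma(n, beta, z) for n in (1, 2, 3)]
        print(name, [round(v, 3) for v in vals])
    # EW  -> [0.5, 1.5, 2.5]
    # KPZ -> [0.667, 2.0, 3.333]
    ```

    Measuring the slopes $\gamma_n$ of the cumulants on a log-log plot and inverting the relation gives $\beta$, $z$ and hence $\alpha = \beta z$, which is why the abstract highlights this as a practical route to exponents that are otherwise hard to estimate.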